Text File | 1993-07-15 | 42.3 KB | 1,270 lines |
-
-
-
-
-
-
- Recommendation X.403
-
- MESSAGE HANDLING SYSTEMS - CONFORMANCE TESTING
-
- The CCITT,
-
- considering
-
- (a) the need for Message Handling Systems;
-
- (b) the need to ensure the interoperability of Message
- Handling Systems;
-
- (c) the need for conformance testing specifications for
- Message Handling Systems;
-
- (d) that the X.400-Series Recommendations specify
- Message Handling Systems;
-
- (e) the state-of-the-art of OSI testing methodology and
- notation within CCITT-ISO,
-
- unanimously declares
-
- (1) that this Recommendation describes the testing
- methodology for Message Handling Systems;
-
- (2) that this Recommendation describes a notation used
- to define test specifications for Message Handling
- Systems;
-
- (3) that this Recommendation describes the scope and
- content of CCITT Conformance Testing Specification
- Manuals for Message Handling Systems.
-
-
- CONTENTS
-
- 0. Introduction
-
- 1. Scope and Field of Application
-
- 2. References
-
- 3. Definitions
-
- 4. Abbreviations
-
- 5. Conventions
-
- 6. Overview
-
- 7. Conformance requirements
-
- 8. Testing methodology
-
- 9. Structure of test suites
-
- 10. Information to be supplied by implementors
-
-
- 11. Test Notation
-
- 12. Conformance Assessment Procedures
-
- Annex A Test Notation
-
- Annex B IPMS(P2) PICS Proformas
-
- Annex C MTS(P1) PICS Proformas
-
- Annex D RTS PICS Proformas
-
-
-
- 0. Introduction
-
- This Recommendation describes the test methods, test criteria and
- test notation to be used for the conformance testing of message
- handling systems based on the 1984 X.400 series of Recommendations
- as supplemented by the X.400-Series Implementor's Guide
- (version 5).
-
- 1. Scope and Field of Application
-
- The message handling protocols in the scope of this Recommendation
- are contained in the 1984 X.400-Series of Recommendations together
- with the X.400 series Implementor's Guide (version 5).
-
- Abstract test specifications for these are contained in the CCITT
- Conformance Testing Specification Manuals associated with this
- Recommendation:
-
- - Conformance Testing Specification Manual for IPMS(P2)
- - Conformance Testing Specification Manual for MTS(P1)
- - Conformance Testing Specification Manual for RTS
-
- Even though these Manuals are referred to by this Recommendation,
- they are not part of it.
-
- While the complete and correct operation of session, transport and
- other lower-layer protocols is required for interworking, the testing
- of these layers is not within the scope of this Recommendation. On
- the other hand, X.400 conformance tests should verify that the
- Reliable Transfer Server (RTS) correctly uses the layers beneath
- it.
-
- The tests defined in this document apply to inter-domain working
- (ADMD to ADMD and ADMD to PRMD). They relate to any MTA or UA in a
- domain that supports communications with other domains.
-
- Conformance testing of the semantics and syntax of the actual body
- part information carried in a BODY PART is beyond the scope of this
- document.
-
- The purpose of this Recommendation is to minimize the time and
- expense that manufacturers of X.400 implementations and providers
- of X.400 services must incur to ensure a high degree of
- interoperability of their equipment. This purpose is achieved by
-
- having a set of X.400 conformance test specifications. The
- successful joint execution of the test specifications by two
- implementations can be accepted as compelling evidence of the
- complete and correct operation of these implementations.
-
- The scope and intention of this Recommendation differ from those of
- other CCITT Recommendations which define communication services and
- protocols, such as the 1984 X.400-Series of Recommendations. The
- purpose of the latter Recommendations is to define a system
- unambiguously. A Recommendation for conformance testing, however,
- provides a well-chosen subset of the virtually infinite number of
- tests needed to guarantee full compliance with a protocol standard.
- The subset is chosen to give a high level of confidence that tested
- implementations will interwork, while taking into account pragmatic
- considerations such as the time taken to perform the tests.
-
- Testing for conformance to functional standards is beyond the scope
- of this Recommendation. However it is recognized that conformance
- tests for functional standards can be derived from this
- Recommendation and the associated Test Specification Manuals.
-
- It should be recognized that the conformance testing of message
- handling systems may fall within the framework of national
- regulations and may be subject to the testing policies of
- Administrations which are beyond the scope of this document.
-
- 2. References
-
- X.400 Message Handling Systems: System Model-Service Elements,
- version 1984.
-
- X.401 Message Handling Systems: Basic service elements and
- optional user facilities, version 1984.
-
- X.408 Message Handling Systems: Encoded information type
- conversion rules, version 1984.
-
- X.409 Message Handling Systems: Presentation transfer syntax
- and notation, version 1984.
-
- X.410 Message Handling Systems: Remote operations and
- reliable transfer server, version 1984.
-
- X.411 Message Handling Systems: Message transfer layer,
- version 1984.
-
- X.420 Message Handling Systems: Interpersonal messaging
- user agent layer, version 1984.
-
- X.210 Open Systems Interconnection (OSI) Layer Service
- Definitions Convention, version 1984.
-
- X.400 Series (1984) Implementor's Guide version 5.
-
- 3. Definitions
-
- 3.1 Service Convention Definitions
-
- This Recommendation makes use of the following terms defined in
- Recommendation X.210, version 1984:
-
- a) primitive;
-
- b) request (primitive);
-
- c) indication (primitive);
-
- d) response (primitive);
-
- e) confirm (primitive).
-
-
- 3.2 Message Handling Definitions
-
- This Recommendation makes use of the following terms defined in
- Recommendation X.400, version 1984:
-
- a) administration management domain;
-
- b) interpersonal message [X.420];
-
- c) message;
-
- d) message transfer [X.411];
-
- e) originator;
-
- f) private management domain;
-
- g) recipient;
-
- h) user.
-
- 4. Abbreviations
-
- The following abbreviations are used in this Recommendation:
-
- ADMD Administration management domain;
-
- ASP Abstract Service Primitive;
-
- DSE Distributed Single-layer Embedded test method;
-
- MHS Message Handling System;
-
- IPMS Interpersonal Messaging System;
-
- IUT Implementation Under Test;
-
- MPDU Message Protocol Data Unit;
-
- MT Message Transfer;
-
- MTA Message Transfer Agent;
-
- MTS Message Transfer System;
-
- P1 The Message Transfer Protocol [X.411];
-
- P2 The Interpersonal Messaging Protocol [X.420];
-
- PCO Point of Control and Observation;
-
- PICS Protocol Implementation Conformance Statement;
-
- PIXIT Protocol Implementation Extra Information for Testing;
-
- PDU Protocol data unit;
-
- PRMD Private management domain;
-
- RTS Reliable Transfer Server;
-
- SAP Service Access Point;
-
- TSP Test Suite Parameter;
-
- TTCN Tree and Tabular Combined Notation;
-
- UA User Agent.
-
- 5. Conventions
-
- No conventions are defined for this Recommendation.
-
- 6. Overview
-
- There are two kinds of CCITT documents concerned with
- X.400 Conformance testing:
-
- (a) This CCITT Recommendation entitled "X.403 Message Handling
- Systems: Conformance Testing".
-
- (b) Three associated CCITT Conformance Testing Specification
- Manuals entitled:
-
- - Conformance Testing Specification Manual for IPMS(P2)
- - Conformance Testing Specification Manual for MTS(P1)
- - Conformance Testing Specification Manual for RTS
-
- The CCITT Recommendation is intended for a wide readership. The
- Manuals are intended for test implementors and contain detailed
- test specifications.
-
-
- 6.1 The X.400 Conformance Testing Recommendation
-
- This Recommendation gives the following information:
-
- (a) Conformance requirements of X.400 implementations.
-
- (b) The testing methodology.
-
- (c) The structure of the test specifications.
-
- (d) Information to be supplied by implementors as a prerequisite to
- conformance testing.
-
- (e) The test notation.
-
- (f) Conformance assessment procedures.
-
- 6.2 The X.400 Conformance Testing Specification Manuals
-
- Three CCITT Conformance Testing Specification Manuals contain test
- specifications for the IPMS(P2), MTS(P1), RTS. The test
- specifications are written in a notation described in general terms
- in clause 11. The Conformance Testing Specification Manuals are
- referred to by this Recommendation but they are not part of it.
-
- Since the Manuals contain detailed and unambiguous test
- specifications, users of these Manuals should be familiar with the
- X.400-Series of Recommendations and with the testing methodology
- used.
-
- 7. Conformance requirements
-
- The purpose of the test specifications referenced by this
- Recommendation is to define tests that will establish to a high
- degree of confidence that the various protocol layers of an
- implementation under test conform to the requirements of the X.400
- series of Recommendations (1984).
-
- A system claiming to conform to the X.400 IPM-Service has to
- support correctly:
-
- - the basic IPM service elements as defined in X.400/Table 2
-
- - the IPM Optional User facilities defined as Essential in
- X.401/Table 1 and Table 2 (where the categorization for
- origination and reception should be considered)
-
- - the IPM Optional User facilities defined as Additional in
- X.401/Table 1 and Table 2, which are claimed to be supported
-
- - the requirements related to the IPM service as defined
- in version 5 of the CCITT X.400-Series Implementor's Guide.
-
- A system claiming to conform to the X.400 MT-service has to
- support correctly:
-
- - the basic MT-service elements as defined in X.400/Table 1
- related to the MTS(P1) protocol
-
- - the MT Optional User facilities defined as Essential in
- X.401/Table 3 and 4 and related to the MTS(P1) protocol
-
- - the MT Optional User facilities defined as Additional in
- X.401/Table 3 and 4 and related to the MTS(P1) protocol,
- which are claimed to be supported
-
- - the requirements related to the P1 MT-service as defined
- in version 5 of the CCITT X.400-Series Implementor's Guide.
-
- A system claiming to conform to the X.400 RTS-service has to
- support correctly:
-
- - the RTS-services as defined in X.410
-
- - the requirements related to the RTS-Service as defined in
- version 5 of the CCITT X.400-Series Implementor's Guide.
-
- Claims of conformance of an implementation to the X.400-Series of
- Recommendations can be tested using the Conformance Testing
- Specification Manuals associated with this Recommendation to
- ensure that:
-
- (a) The implementation does not act or react in a way that differs
- from that described in the Recommendations.
-
- (b) The implementation is capable of handling protocol errors.
-
- The reaction of an implementation on receipt of protocol errors
- is not defined in the X.400 Series of Recommendations. For the
- purpose of conformance testing the minimum additional
- requirement is made that the implementation subsequently
- continues to operate normally in such cases.
-
- The absence of a mandatory protocol element in P2 or P1 is
- regarded as a protocol error. It should be noted that in an
- implemented MHS a recipient domain may choose to deliver an
- incorrect MPDU. This should be considered as proprietary design
- by the equipment vendor, and the specific actions taken in these
- situations are defined by the vendor and not subject to
- conformance.
-
- (c) The implementation correctly handles the requirements defined
- in X.400 Implementor's Guide Version 5.
-
- Maximum lengths and maximum number of occurrences are
- interpreted in the following way:
-
- - on origination: the implementation may support maximum
- lengths/occurrences up to but not exceeding the constraint
- value.
-
- - on reception: the implementation must support the maximum
- lengths/occurrences of the constraints. Values above the
- constraints may be supported but the conformance requirements
- on the implementation upon reception of a length/occurrence
- exceeding the constraint are the same as for protocol errors.
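The origination/reception rule above can be sketched as follows. This is an illustrative sketch only: the constraint value and the helper names are hypothetical, and nothing here is defined by X.403 or the Implementor's Guide.

```python
MAX_OCCURRENCES = 32  # hypothetical constraint value for one protocol element

def origination_ok(n_occurrences: int) -> bool:
    """On origination the IUT may generate up to, but never more than,
    the constraint value."""
    return n_occurrences <= MAX_OCCURRENCES

def reception_verdict(n_occurrences: int) -> str:
    """On reception the IUT must accept up to the constraint value.
    Values above it may be supported, but the conformance requirement
    is the same as for a protocol error: the IUT must simply keep
    operating normally."""
    if n_occurrences <= MAX_OCCURRENCES:
        return "must-accept"
    return "treat-as-protocol-error"
```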
-
- Claims of conformance to the X.400 series of Recommendations cannot
- be tested for those implementations for which it is not possible to
- perform all the required tests for features labeled mandatory, basic
- or essential optional.
-
-
- 8. Testing methodology
-
- 8.1 Test configurations
-
- Two test configurations are used. The first configuration is shown
- in Figure 1/X.403 and is used to test IPMS(P2), MTS(P1) and RTS.
-
-
-          Figure 1/X.403  End system configuration.  [diagram omitted]
-
-
- The second configuration is shown in Figure 2/X.403 and is used to
- test the relay aspects of the MTS(P1) protocol.
-
-
-          Figure 2/X.403  Relaying MTA test configuration.  [diagram omitted]
-
-
- 8.2 Points of Control and Observation
-
- Test cases are described abstractly in terms of events at points of
- control and observation (PCO) in both the tester and the
- implementation under test (IUT). These PCOs are generally Service
- Access Points (SAPs) and the events are generally Abstract Service
- Primitives (ASPs). This does not imply that manufacturers are
- required to have accessible SAPs or to implement ASPs within their
- systems. During test execution the PCOs of an IUT may be accessed
- indirectly through a user interface. Where testing is performed
- through a user interface, the mapping of events between the SAP and
- the user interface is provided by the supplier of the IUT as
- described in clause 10.2.
-
- 8.2.1 PCOs for IPMS(P2)
-
- The IPMS(P2) test cases are described using the Points of Control
- and Observation (PCOs) shown in Figure 3/X.403:
-
-
-          Figure 3/X.403  Points of control and observation for IPMS(P2).
-          [diagram omitted]
-
-
- For the tester, the Point of Control and Observation is the Service
- Access Point (SAP) defined at the boundary between the User Agent
- Layer and the Message Transfer Layer. This PCO makes use of the
- Message Transfer Layer Service Primitives defined in
- Recommendation X.411.
-
- For the IUT, the PCO is the SAP defined at the upper boundary of
- the User Agent Layer. However, Recommendation X.420 does not define
- Service Primitives; it has therefore been necessary to construct
- hypothetical ones for sending and receiving IP-messages, so that the
- test cases can be described in a formal way.
-
-
- 8.2.2 PCOs for MTS(P1)
-
- The MTS(P1) test cases are described using the PCOs shown in
- Figure 4/X.403:
-
-
-          Figure 4/X.403  Points of control and observation for MTS(P1).
-          [diagram omitted]
-
-
- For the tester, the PCO is the SAP defined at the boundary between
- the MT Layer and the RTS. This PCO makes use of the RTS primitives
- defined in Recommendation X.410.
-
- For the IUT, the PCO is the SAP defined at the boundary between the
- UA Layer and the MT Layer. This PCO makes use of the MT Service
- Primitives defined in Recommendation X.411.
-
- The testing of relay functions requires more than one tester SAP.
- Similarly the testing of multiple destination delivery requires
- more than one UA on the IUT.
-
- 8.2.3 PCOs for RTS
-
- The RTS test cases are described using the PCOs shown in
- Figure 5/X.403:
-
-
-          Figure 5/X.403  Points of control and observation for RTS.
-          [diagram omitted]
-
-
- For the tester, the PCO is the SAP defined at the boundary between
- the RTS and the Session Layer. This PCO makes use of the Session
- Service Primitives defined in Recommendation X.215.
-
- For the IUT, the PCO is the SAP defined at the upper boundary of
- the User Agent Layer. This PCO makes use of the same hypothetical
- Service Primitives defined for IPMS(P2) (section 8.2.1).
-
- The description of the RTS test cases includes events at a third
- SAP at the IUT (SAP-I) between the MT Layer and the RTS. The events
- at this SAP are shown only for clarification; the SAP is not used as
- a PCO.
-
-
-
-
- 8.3 Test Design Strategy
-
- The MHS test specifications are designed using the following
- concepts:
-
- (a) A test specification is defined as a test suite composed of a
- number of test cases as defined in clause 11.1.
-
- (b) Test cases are defined in terms of
-
- - lower layer ASP events at the tester
- - upper layer ASP events at the IUT
-
- (c) The test cases define the sequencing of these ASP events and
- the associated parameters, in particular the PDUs.
-
- (d) Test cases for valid behaviour specify ASP event sequences and
- PDUs that are in accordance with the X.400 series of
- Recommendations.
-
- (e) Test cases for invalid behaviour are characterized by:
-
- - A correct PDU or event initiated by the tester in a protocol
- state where it is not permitted (an inopportune event), or
-
- - a correct PDU incorporating an element which is
- syntactically correct and in range, but conflicts with the
- negotiated value, or
-
- - a PDU sent by the tester which is syntactically incorrect
- (examples are a missing mandatory protocol element, an out-of-
- range value or an incorrectly encoded length indicator) or
-
- - for RTS, a lower-layer ASP event issued by the tester with
- parameters that are not allowed or not appropriate under X.400
- restrictions (for example, SPSN in SConnect).
-
- (f) The depth of testing is restricted to a reasonable
- number of test cases using the following principles:
-
- For valid behaviour:
-
- - If there is a small number of valid protocol element values,
- test all of them.
-
- - If there is a range of values, test the bounds and a few
- common values.
-
- - If there are no bounds, test an extreme value besides the
- common ones.
-
- For invalid behaviour:
-
- - The number of test cases for a particular type of error is
- reduced to one or just a few common ones.
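The value-selection principles in (f) above can be sketched as a single helper. The function name, the "test all" criterion and the choice of extreme value are illustrative assumptions, not part of X.403.

```python
def select_test_values(valid_values=None, bounds=None, common=()):
    """Apply the depth-of-testing principles:
    - a small set of valid values is tested exhaustively;
    - a bounded range is tested at its bounds plus a few common values;
    - an unbounded feature gets one extreme value besides the common ones."""
    if valid_values is not None:
        return sorted(set(valid_values))      # small value set: test all
    if bounds is not None:
        lo, hi = bounds
        return sorted({lo, hi, *common})      # bounds + common values
    return sorted({*common, 10**6})           # no bounds: add one extreme
```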
-
- 8.3.1 Strategy for X.409 testing
-
- The X.409 test cases defined in the CCITT Conformance Testing
- Specification Manuals associated with this Recommendation are
- applicable only to X.400 message handling systems. The testing of
- X.409 is done as part of the MTS(P1), IPMS(P2) and RTS testing. The
- features tested are the data types defined in X.409, the various
- forms of length encoding and the use of primitive and constructor
- data elements. To increase the likelihood that the tests can be
- performed, the test cases wherever possible have been defined using
- the protocol elements associated with mandatory service elements.
-
- Two categories of X.409 tests are identified:
-
- - Decoding Tests
-
- These tests are constructed by identifying X.409 features to be
- exercised and devising sets of correctly and incorrectly encoded
- test PDUs containing these features. The tests are performed by
- transmitting the test PDUs to the IUT and observing the local
- reaction of the implementation and/or any PDUs returned to the
- tester.
-
- - Encoding Tests
-
- These tests are constructed by identifying a set of user service
- requests that will generate PDUs whose encoding will exercise
- major X.409 features. The tester must check the validity of the
- coding of the resulting PDUs generated by the IUT.
-
- The decoding tests allow the X.409 decoding features of an
- implementation to be fully exercised using valid and invalid test
- PDUs. Encoding tests only allow the valid behaviour of X.409
- encoding to be checked.
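As one concrete illustration, the "various forms of length encoding" tested here are the X.409 short, long and indefinite length forms. The minimal decoder below is a sketch written for this note, not code from the Manuals; it shows the distinction the valid and invalid decoding test PDUs are designed to probe.

```python
def decode_length(octets: bytes):
    """Decode an X.409 length field, returning (length, octets_consumed);
    length is None for the indefinite form. Raises ValueError on the kind
    of invalid encodings the decoding tests send to the IUT."""
    if not octets:
        raise ValueError("empty length field")
    first = octets[0]
    if first < 0x80:                      # short form: one octet, value 0..127
        return first, 1
    if first == 0x80:                     # indefinite form
        return None, 1
    n = first & 0x7F                      # long form: n subsequent octets
    if len(octets) < 1 + n:
        raise ValueError("truncated long-form length")
    return int.from_bytes(octets[1:1 + n], "big"), 1 + n
```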
-
- 8.3.2 Strategy for IPMS(P2) testing
-
- Two categories of test are identified:
-
- - IUT as originator
- - IUT as recipient
-
- With the IUT as originator, for each service element supported by
- the implementation, tests are performed by:
-
- - Invoking the service.
- - The tester checking the validity of the resulting PDUs.
- - Where appropriate the tester returning valid and invalid
- response PDUs to the originator.
-
- With the IUT as recipient, for each service element, tests are
- performed by:
-
- - The tester sending valid and invalid PDUs for that service.
- - Observing the local reaction of the UA.
- - Checking the validity of any further PDUs generated by the UA.
-
- In order to avoid unnecessary duplication of test cases, IPM
- service elements which are also MT service elements (for instance
- Delivery Notification) are listed in the MTS(P1) test suite in
- conjunction with the corresponding MT service elements, and not in
- the IPMS(P2) test suite.
-
- It is assumed that the testing of the MT layer is done through a
- User Agent.
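The originator-side pattern above can be sketched as a simple driver loop. The `iut_invoke` and `tester_check` callables are hypothetical stand-ins for the PCO interfaces of Figure 3; they are not defined anywhere in X.403.

```python
def originator_tests(iut_invoke, tester_check, supported_elements):
    """For each service element claimed as supported: invoke it at the
    IUT, then let the tester validate the resulting PDU. Returns a
    verdict per service element."""
    verdicts = {}
    for element in supported_elements:
        pdu = iut_invoke(element)             # event at the IUT's PCO
        ok = tester_check(element, pdu)       # tester-side PDU validation
        verdicts[element] = "pass" if ok else "fail"
    return verdicts
```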
-
- 8.3.3 Strategy for MTS(P1) testing
-
- When testing the operation of an MTS(P1) implementation, five
- categories of tests are identified.
-
- - IUT as originator
- - IUT as recipient
- - IUT as relay
- - IUT as relay recipient
- - IUT as recipient/originator
-
- With the IUT as originator, for each service element supported by
- the implementation, tests are performed by:
-
- - Invoking the service.
- - Checking the validity of the resulting PDUs.
-
- With the IUT as recipient, for each service element supported by
- the implementation, tests are performed by:
-
- - The tester sending valid and invalid PDUs for that service.
- - Observing the local reaction of the UA.
- - Checking the validity of any further PDUs generated by the UA.
-
- With the IUT as relay, for each service element tests are
- performed by:
-
- - The tester sending valid and invalid PDUs for relaying.
- - Checking the validity of the reaction of the IUT.
-
- With the IUT as a relay recipient, for each service element tests
- are performed by:
-
- - Sending a set of valid and invalid PDUs destined for more than
- one recipient. At least one of these recipients is attached to
- the IUT and a further recipient is attached to a remote MTA such
- that the IUT has to relay the message.
-
- - Checking the validity of the reaction of the IUT as recipient.
-
- - Checking that the PDUs that are relayed are not corrupted and
- are modified appropriately.
-
- With the IUT as a recipient/originator, for each service element
- supported by the implementation, tests are performed by:
-
- - Invoking the IUT to send a message to multiple recipients.
- At least one recipient will be attached to the IUT itself and a
- further recipient will be attached to a remote MTA.
-
- - Checking the validity of the reaction of the IUT as recipient.
-
- - Checking the validity of the PDUs transmitted by the IUT.
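The relay check described above ("not corrupted and modified appropriately") can be sketched as a predicate. The field names (`content`, `trace`, `mta`) are illustrative stand-ins, not the actual P1 abstract syntax; the check assumes the relaying MTA appends exactly one trace element of its own.

```python
def relay_ok(sent_mpdu, relayed_mpdu, relay_mta_name):
    """A relayed MPDU passes if its content is untouched and the
    relaying MTA has appended one trace element identifying itself."""
    content_intact = relayed_mpdu["content"] == sent_mpdu["content"]
    trace = relayed_mpdu["trace"]
    trace_extended = (
        trace[:-1] == sent_mpdu["trace"]
        and trace[-1]["mta"] == relay_mta_name
    )
    return content_intact and trace_extended
```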
-
- 8.3.4 Strategy for RTS testing
-
- The following testing phases are used:
-
- (a) The connection/association establishment and negotiation phase.
-
- The X.410 Recommendation allows different negotiable options and
- the negotiation phase is tested exhaustively using valid and
- invalid elements.
-
- (b) The orderly release of the connection/association.
-
- Only a few tests are required to check the correct
- implementation of the RTS release features.
-
- (c) The data transfer phase with token exchange.
-
- The data transfer tests check:
-
- - The correct operation of data transfer using the negotiated
- values.
-
- - The correct operation of token exchange.
-
- - The correct confirmation of confirmed services.
-
- - The correct reaction to invalid (e.g. non-negotiated) elements.
-
- (d) Recovery
-
- Tests are performed to check that an IUT can perform correct
- recovery after:
-
- - User aborts
-
- - Provider aborts
-
- - Exception reports
-
- - Unacknowledged checkpoints
-
- 9. Structure of test suites
-
- The IPMS(P2) and MTS(P1) test suites have a common structure which
- differs from that of the RTS test suites.
-
- 9.1 Structure of IPMS(P2) and MTS(P1) test suites
-
- The IPMS(P2) and MTS(P1) test suites consist of five groups of test
- cases:
-
- (a) Initial Tests
-
- The Initial Tests check mandatory features in a small number of
- test cases. They have been defined in order to check that the
- implementation correctly supports the main mandatory features
- and that it is sensible to continue with full conformance
- testing.
-
- (b) X.409 Tests
-
- The X.409 Tests check the IUT's encoding and decoding of
- protocol elements. Decoding tests are performed by transmitting
- test PDUs to the IUT. Encoding tests are performed by checking
- PDUs received from the IUT.
-
- (c) Protocol Element tests
-
- Protocol Element tests identify test purposes for every protocol
- element in the IPMS(P2)/MTS(P1) protocols. This is important in
- ensuring a full test coverage for the IPMS(P2)/MTS(P1)
- protocols. Many of these tests are necessarily performed as part
- of the Service Element tests.
-
- (d) Service Element tests
-
- Service Element tests check the capability of the IUT to support
- the service elements in X.400. Some of these tests are carried
- out in the initial tests and the X.409 tests. Service Element
- tests include both tests for specific service elements and tests
- for combinations of interdependent service elements.
-
- (e) Additional Test
-
- The Additional Test group checks features not covered in the
- other test groups.
-
- As indicated in (a) to (e) above the number of test cases has been
- minimized by taking advantage of the fact that the performance of a
- given test case may cover more than one test purpose. Figure
- 6/X.403 shows how some of the test purposes identified in a
- particular test group may actually be achieved by test cases in
- another group.
-
-
-          Figure 6/X.403  Structure of IPMS(P2) and MTS(P1) test suites.
-          [diagram omitted]
-
-
- 9.2 Structure of RTS test suites
-
- The RTS test suite is made up of five groups of test cases:
-
- - Association Establishment Tests
- - Association Release Tests
- - Data Transfer Tests
- - Association Recovery Tests
- - X.409 Tests
-
- The Association Establishment Tests check the negotiation of the
- connection elements.
-
- The Association Release Tests check the orderly release of
- associations.
-
- The Data Transfer Tests check that data is transferred correctly in
- accordance with the values of the connection elements negotiated
- during association establishment.
-
- The Association Recovery Tests check that the IUT can recover from
- breaks in connection both inside and outside activities.
-
- The X.409 Tests check the IUT's encoding and decoding of Session
- Service User Data.
-
- 10. Information to be supplied by implementors
-
- 10.1 Protocol Implementation Conformance Statement (PICS)
-
- The Protocol Implementation Conformance Statement (PICS) is
- information supplied by an implementor that specifies the protocol
- features implemented in a Message Handling System.
-
- This information is used during conformance testing:
-
- - To check that the protocol features that have been implemented
- are consistent with the conformance requirements, in terms of
- optional and mandatory features, of the X.400 series
- Recommendations.
-
- - To select the originator tests to be executed. Recipient and
- relay tests will be performed to check the behaviour of the
- system even when it is requested to handle features that it does
- not implement.
-
- PICS proformas for IPMS(P2), MTS(P1) and RTS are shown in Annex B,
- C and D. These proformas specify the information to be supplied by
- an implementor concerning:
-
- - The services that are supported for origination, reception and
- relay functions.
-
- - The protocol features that have been implemented in order to
- support the services.
-
- The IPMS(P2) PICS explicitly includes the MTS(P1) service elements
- made available by the IPMS(P2). In order to avoid duplication with
- the MTS(P1) test suite, tests for such MTS(P1) service elements are
- not contained in the IPMS(P2) test suite. Where the testing of
- MTS(P1) is not performed using a UA, MTS(P1) tests may need to be
- repeated using a UA in order to ensure conformance to the IPMS(P2).
-
- 10.2 Protocol Implementation Extra Information for Testing (PIXIT)
-
- The Protocol Implementation eXtra Information for Testing (PIXIT)
- is supplied by an implementor specifying information needed by a
- tester to execute a test suite.
-
- The IPMS(P2), MTS(P1) and RTS test suites define the behaviour of
- the implementation in terms of abstract service primitives. In
- order to invoke and observe this behaviour during test execution
- the test operator must know how (if at all) these abstract service
- primitives can be invoked or observed at the real accessible user
- interface.
-
- The IPMS(P2), MTS(P1) and RTS PIXIT proformas will list all the IUT
- upper layer abstract service primitives used in the test
- definitions and will ask the implementor to specify how these
- primitives can be invoked or observed (if at all).
-
- 11. Test Notation
-
- 11.1 Definitions
-
- The notation used to define the MHS test specifications makes use
- of the following definitions:
-
- (a) Test Suite
-
- A set of test cases, possibly combined into nested test groups,
- necessary to perform conformance testing of an implementation.
-
- The test suites do not imply an order of execution.
-
- (b) Test Group
-
- A set of related test cases. Test groups may be nested to
- provide a logical structuring of test cases.
-
- (c) Test Case
-
- Specifies the sequences of test events required to achieve the
- purpose of the test and to assign a verdict "pass", "fail" or
- "inconclusive".
-
- (d) Test Event
-
- An indivisible unit of test specification at the level of
- abstraction of the specification (e.g. sending or receiving a
- single PDU).
-
- (e) User
-
- A user-interface process or a computer application which makes
- use of an MHS.
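The containment hierarchy defined in (a) to (d) can be sketched with simple data types. The type names are hypothetical and TTCN itself is a tabular notation, not a programming interface; this only restates the definitions: a suite holds (possibly nested) groups, a group holds test cases, and a test case is a sequence of test events ending in one of the three verdicts.

```python
from dataclasses import dataclass, field
from typing import List, Union

VERDICTS = ("pass", "fail", "inconclusive")

@dataclass
class TestEvent:
    description: str                  # e.g. "send a single PDU"

@dataclass
class TestCase:
    purpose: str
    events: List[TestEvent]
    verdict: str = "inconclusive"     # assigned when the case is run

@dataclass
class TestGroup:                      # groups may nest
    name: str
    members: List[Union["TestGroup", TestCase]] = field(default_factory=list)

@dataclass
class TestSuite:                      # no execution order implied
    groups: List[TestGroup]
```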
-
- 11.2 Notation
-
- The Conformance Test Suites for Message Handling Systems use the
- Tree and Tabular Combined Notation as described in Annex A of this
- Recommendation.
-
- Each test suite specification is defined in six sections:
-
- 1 Introduction
-
- This contains an overview describing the scope of the tests and
- the structure of the test suite.
-
- 2 Summary of Test cases
-
- This is a list of all tests giving the test identifier, the
- test reference and a short title for each test case in the test
- suite.
-
- 3 Declarations Part
-
- Declares the names and types of all items to be used in
- defining the test cases.
-
- 4 Dynamic Part
-
- This is the main body of the test suite and defines test cases
- in terms of trees of behaviour.
-
- 5 Constraints Part
-
- Specifies the values of the ASPs and PDUs used in the Dynamic
- Part.
-
- 6 Cross references
-
- Provides an index to all values used in the main body of the
- test suite.
-
- 12. Conformance Assessment Procedures
-
- This Recommendation deals only with abstract test specifications
- for Message Handling Systems. It does not deal with the realization
- of these test specifications nor with their execution. This clause
- is provided purely for information, describing in general terms how
- real testing may be done.
-
- 12.1 Overview of the Procedure
-
- The procedures needed to assess the conformance of an
- implementation include:
-
- - The completion of the PICS and PIXIT proformas by the supplier
- of the implementation.
-
- - The assessment of these documents.
-
- - The selection and execution of test cases.
-
- - The analysis of the results and the production of test reports.
-
-
-          Figure 7/X.403  The Conformance Assessment Procedure.
-          [diagram omitted]
-
-
- 12.2 Analysis of PICS
-
- The first phase in conformance assessment is to ensure that the
- features claimed to be supported by an IUT comply with appropriate
- conformance requirements. The conformance requirements for
- IPMS(P2), MTS(P1) and RTS implementations are defined in clause 7
- of this document. This check is performed by analysing the
- information in the PICS documents.
-
- 12.3 Test Case Selection
-
- The tests to be performed are selected primarily on the basis of
- information in the PICS. For every supported feature claimed in the
- PICS the corresponding test cases in the test suites are selected
- and executed to check the correct implementation of these features
- under an extensive range of valid and invalid conditions.
-
- For non-supported features, some recipient test cases shall be
- executed to explore the response of the IUT. Since in general the
- X.400 (1984) Series of Recommendations do not define the expected
- behaviour in these situations, these tests can be "passed" with
- almost any behaviour apart from catastrophic failure by the IUT.
-
- Information in the PIXIT may also provide some constraints on the
- test cases that can be executed.
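The selection rule of this clause can be sketched as a filter over (feature, role) pairs. The data shapes are illustrative assumptions: originator tests run only for features claimed in the PICS, while recipient (and relay) tests run even for unclaimed features, to probe the IUT's reaction.

```python
def select_tests(test_cases, pics_supported):
    """test_cases: (feature, role) pairs, role being 'originator',
    'recipient' or 'relay'. An originator test is dropped unless the
    feature is claimed in the PICS, since an unsupported feature
    cannot be invoked at the IUT."""
    return [
        (feature, role)
        for feature, role in test_cases
        if role != "originator" or feature in pics_supported
    ]
```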
-
- 12.4 Execution of Tests
-
- It is recommended that the testing of Message Handling Systems
- should be done in the order of RTS, then MTS(P1) and then IPMS(P2)
- testing.
-
- However the order of test cases in the test suites does not imply
- an order of execution. Apart from the general recommendation that
- for IPMS(P2)/MTS(P1) the Initial Test Group should be executed
- first, the order of execution of tests can be determined by the
- test operators taking into account their test environment and test
- tools.
-